The SEO case for competitive analyses
"We need more links!" "I read that user experience (UX) matters more than everything else in SEO, so we should focus solely on UX split tests." "We just need more keywords on these pages."
If you dropped a quarter on the sidewalk, but had no light to look for it, would you walk to the next block with a street light to retrieve it? The obvious answer is no, yet many marketers get tunnel vision when it comes to where their efforts should be focused.
That's why I'm sharing a checklist with you today that will allow you to compare your website to your search competitors and identify your site's strengths, weaknesses, and potential opportunities based on ranking factors we know are important.
If you're unconvinced that good SEO is really just digital marketing, I'll let AJ Kohn persuade you otherwise. As any good SEO (or even keyword research newbie) knows, it's crucial to understand the effort involved in ranking for a specific term before you begin optimizing for it.
It's easy to get frustrated when stakeholders ask how to rank for a specific term and focus solely on content to create or on-page optimizations to make. Why? Because we've known for a while that there are myriad factors that play into search engine rank. Depending on the competitive search landscape, there may not be any amount of "optimizing" that you can do in order to rank for a specific term.
The story that I've been able to tell my clients is one of hidden opportunity, but the only way to expose these undiscovered gems is to broaden your SEO perspective beyond search engine results page (SERP) position and best practices. And the place to begin is with a competitive analysis.
Competitive analyses help you evaluate your competition's strategies to determine their strengths and weaknesses relative to your brand. When it comes to digital marketing and SEO, however, there are so many ranking factors and best practices to consider that it can be hard to know where to begin. Which is why my colleague, Ben Estes, created a competitive analysis checklist (not dissimilar to his wildly popular technical audit checklist) that I've souped up for the Moz community.
This checklist is broken out into sections that reflect key elements from our Balanced Digital Scorecard. As previously mentioned, this checklist is meant to help you identify opportunities (and possibly areas not worth your time and budget). But this competitive analysis is not prescriptive in and of itself. It should be used as its name suggests: to analyze what your competition's "edge" is.
Methodology
Choosing competitors
Before you begin, you'll need to identify six brands to compare your website against. These should be your search competitors (who else is ranking for terms that you're ranking for, or would like to rank for?) in addition to a business competitor (or two). Don't know who your search competition is? You can use SEMRush and Searchmetrics to identify them, and if you want to be extra thorough you can use this Moz post as a guide.
Editor’s note: Moz has launched new competitor analysis tools so you can see exactly who your true competitors are. It’s free and instant, so give it a try.
Sample sets of pages
For each site, you'll need to select five URLs to serve as your sample set. These are the pages you will review and evaluate against the competitive analysis items. When selecting a sample set, I always include:
- The brand's homepage,
- Two "product" pages (or an equivalent),
- One to two "browse" pages, and
- A page that serves as a hub for news/informative content.
Make sure each site's sample pages are equivalent to one another, so the comparison is fair.
Scoring
The scoring options for each checklist item range from zero to four, and are determined relative to each competitor's performance. This means that a score of two serves as the average performance in that category.
For example, if each sample set has one unique H1 tag per page, then each competitor would get a score of two for H1s appear technically optimized. However, if a site breaks one (or more) of the requirements below, it should receive a score of zero or one:
- One or more pages within sample set contains more than one H1 tag on it, and/or
- H1 tags are duplicated across a brand's sample set of pages.
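Most of these scores come down to judgment, but for the items you can reduce to a single number per competitor (LRD counts, pages with errors, and so on), a quick sketch like the one below can sanity-check your relative grading. This is just an illustration in Python with made-up numbers, not part of the checklist itself:

```python
# A minimal sketch of relative 0-4 scoring, assuming you've already turned a
# checklist item into one comparable number per competitor. The group average
# lands at 2; better-than-average performance pushes toward 3-4, worse toward 0-1.
from statistics import mean, pstdev

def relative_scores(measurements, higher_is_better=True):
    """Map {competitor: raw_value} to {competitor: score 0-4} relative to the group."""
    values = list(measurements.values())
    avg, spread = mean(values), pstdev(values) or 1.0  # avoid dividing by zero
    scores = {}
    for name, value in measurements.items():
        deviation = (value - avg) / spread
        if not higher_is_better:
            deviation = -deviation
        # Average lands at 2; roughly one deviation above/below moves to 3/1; extremes clamp at 4/0.
        scores[name] = max(0, min(4, round(2 + deviation)))
    return scores

# Example: total dofollow linking root domains per domain (hypothetical numbers).
print(relative_scores({"you.com": 1200, "comp-a.com": 4100, "comp-b.com": 900}))
```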
Checklist
Platform (technical optimization)
Title tags appear technically optimized. This measurement should be as quantitative as possible, and refer only to technical SEO rather than its written quality. Evaluate the sampled pages based on:
- Only one title tag per page,
- The title tag being correctly placed within the head tags of the page, and
- Few to no extraneous tags within the title (e.g. ideally no inline CSS, and few to no span tags).
H1s appear technically optimized. Like with the title tags, this is another quantitative measure: make sure the H1 tags on your sample pages are sound by technical SEO standards (and not based on writing quality). You should look for:
- Only one H1 tag per page, and
- Few to no extraneous tags within the tag (e.g. ideally no inline CSS, and few to no span tags).
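If you'd rather script these title and H1 checks than read each page's source by hand, here's a rough sketch. requests and BeautifulSoup are my tooling assumptions (the checklist doesn't require them), and the sample URLs are placeholders:

```python
# A rough sketch of the technical title/H1 checks: one tag per page, title inside
# <head>, and no extraneous markup nested inside either tag.
import requests
from bs4 import BeautifulSoup

def check_title_and_h1(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    titles = soup.find_all("title")
    h1s = soup.find_all("h1")
    return {
        "url": url,
        "title_count": len(titles),                      # should be exactly 1
        "title_in_head": bool(soup.head and soup.head.find("title")),
        "h1_count": len(h1s),                            # should be exactly 1
        "h1_texts": [h1.get_text(strip=True) for h1 in h1s],  # compare across the sample set for duplication
        "title_has_nested_tags": any(t.find() is not None for t in titles),
        "h1_has_nested_tags": any(h.find() is not None for h in h1s),
    }

sample_pages = ["https://www.example.com/", "https://www.example.com/product-a"]
for page in sample_pages:
    print(check_title_and_h1(page))
```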
Internal linking allows indexation of content. Observe the internal outlinks on your sample pages, apart from the sites' navigation and footer links. This line item serves to check that the domains are consolidating their crawl budgets by linking to discoverable, indexable content on their websites. Here is an easy-to-use Chrome plugin from fellow Distiller Dom Woodman to see whether the pages are indexable.
To get a score of "2" or more, your sample pages should link to pages that:
- Produce 200 status codes (for all, or nearly all), and
- Have no more than ~300 outlinks per page (including the navigation and footer links).
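Here's a minimal sketch of that check, again assuming requests and BeautifulSoup. It counts every outlink (navigation and footer included, per the ~300 guideline) but only spot-checks a slice of status codes so you aren't hammering anyone's server:

```python
# Gather a page's internal outlinks, count them, and confirm a sample resolve to 200s.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def audit_outlinks(page_url, max_checks=50):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    domain = urlparse(page_url).netloc
    internal = [urljoin(page_url, a["href"]) for a in soup.find_all("a", href=True)
                if urlparse(urljoin(page_url, a["href"])).netloc == domain]
    statuses = {}
    for link in internal[:max_checks]:  # spot-check a slice to stay polite
        try:
            statuses[link] = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            statuses[link] = None
    non_200 = {link: code for link, code in statuses.items() if code != 200}
    return {"total_outlinks": len(internal), "checked": len(statuses), "non_200": non_200}

print(audit_outlinks("https://www.example.com/"))
```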
Schema markup present. This is an easy check. Using Google's Structured Data Testing Tool, look to see whether these pages have any schema markup implemented, and if so, whether it is correct. In order to receive a score of "2" here, your sampled pages need:
- To have schema markup present, and
- Be error-free.
Quality of schema is definitely important, and can make the difference between a brand receiving a score of "3" or "4." Elements to keep in mind are: Organization or Website markup on every sample page, customized markup like BlogPosting or Article on editorial content, and Product markup on product pages.
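If you want a quick programmatic first pass before pasting pages into the Structured Data Testing Tool, the sketch below lists the @type values found in each page's JSON-LD. It only covers JSON-LD (not microdata or RDFa) and doesn't validate anything, so Google's tool is still the source of truth:

```python
# Pull any JSON-LD blocks from a page and list the schema.org @type values found.
import json
import requests
from bs4 import BeautifulSoup

def jsonld_types(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    types = []
    for block in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(block.string or "")
        except json.JSONDecodeError:
            types.append("INVALID JSON-LD")  # worth a manual look in the testing tool
            continue
        items = data if isinstance(data, list) else [data]
        types.extend(item.get("@type", "unknown") for item in items if isinstance(item, dict))
    return types

print(jsonld_types("https://www.example.com/product-a"))  # e.g. ['Organization', 'Product']
```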
There is a "home" for newly published content. A hub for new content can be the site's blog, or a news section. For instance, Distilled's "home for newly published content" is the Resources section. While this line item may seem like a binary (score of "0" if you don't have a dedicated section for new content, or score of "2" if you do), there are nuances that can bring each brand's score up or down. For example:
- Is the home for new content unclear, or difficult to find? Approach this exercise as though you are a new visitor to the site.
- Does there appear to be more than one "home" of new content?
- If there is a content hub, is it apparent that this is for newly published pieces?
We're not obviously messing up technical SEO. This score is based partly on each brand's performance on the preceding line items (mainly Title tags appear technically optimized through Schema markup present).
It would be unreasonable to run a full technical audit of each competitor, but take into account your own site's technical SEO performance if you know there are outstanding technical issues to be addressed. In addition to the previous checklist items, I also like to use these Chrome extensions from Ayima: Page Insights and Redirect Path. These can provide quick checks for common technical SEO errors.
Content
Title tags appear optimized (editorially). Here is where we can add more context to the overall quality of the sample pages' titles. Even if they are technically optimized, the titles may not be optimized for distinctiveness or written quality. Note that we are not evaluating keyword targeting, but rather making a holistic (and broad) evaluation of how each competitor's site approaches SEO factors. Evaluate each page's title on whether it is unique within the sample set, distinct from the page's H1, and representative of the content on the page.
H1s appear optimized (editorially). The same rules that apply to titles for editorial quality also apply to H1 tags. Review each sampled page's H1 for:
- A unique H1 tag per page (language in H1 tags does not repeat),
- H1 tags that are discrete from their page's title, and
- H1 tags that represent the content on the page.
Internal linking supports organic content. Here you must look for internal outlinks outside of each site's header and footer links. This evaluation is not based on the number of unique internal links on each sampled page, but rather on the quality of the pages to which our brands are linking.
While "organic content" is a broad term (and invariably differs by business vertical), here are some guidelines:
- Look for links to informative pages like tutorials, guides, research, or even think pieces.
- The blog posts on Moz (including this very one) are good examples of organic content.
- Internal links should naturally continue the user's journey, so look for topical progression in each site's internal links.
- Links to service pages, products, RSVP, or email subscription forms are not examples of organic content.
- Make sure the internal links vary. If sampled pages are repeatedly linking to the same resources, this will only benefit those few pages.
- This doesn't mean that you should penalize a brand for linking to the same resource two, three, or even four times over. Use your best judgment when observing the sampled pages' linking strategies.
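A quick tally can show how varied the link targets actually are across a sample set. The data here is hypothetical; you'd swap in the in-content links you collected for each sampled page:

```python
# Count how often each in-content internal link target repeats across the sample set.
from collections import Counter

# Hypothetical data: sampled page -> in-content internal link targets (header/footer excluded).
links_by_page = {
    "/blog/post-1": ["/guides/seo-basics", "/blog/post-2", "/guides/link-building"],
    "/blog/post-2": ["/guides/seo-basics", "/blog/post-3"],
    "/product-a":   ["/guides/seo-basics", "/guides/buying-guide"],
}

target_counts = Counter(target for targets in links_by_page.values() for target in targets)
print("Unique targets:", len(target_counts))
print("Most-repeated targets:", target_counts.most_common(3))
# If a handful of URLs dominate, the internal links aren't spreading value around.
```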
Appropriate informational content. You can use the found "organic content" from your sample sets (and the samples themselves) to review whether the site is producing appropriate informational content.
What does that mean, exactly?
- The content produced obviously fits within the siteβs business vertical, area of expertise, or cause.
- Example: Moz's SEO and Inbound Marketing Blog is an appropriate fit for an SEO company.
- The content on the site isn't so self-promotional that an average user wouldn't trust the domain to produce unbiased information.
- Example: If Distilled produced a list of "Best Digital Marketing Agencies," it's highly unlikely that users would find it trustworthy given our inherent bias!
Quality of content. Highly subjective, yes, but remember: you're comparing brands against each other. Here's what to evaluate:
- Are "informative" pages discussing complex topics in under 400 words (a sign the coverage may be too thin)? See the quick word-count sketch after this list.
- Do you want to read the content?
- Largely, do the pages seem well-written and full of valuable information?
- Conversely, are the sites littered with "listicles," or full of generic info you can find in millions of other places online?
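For the word-count question specifically, a few lines of Python will give you a rough number per page. The <article>/<main> selector is a guess you'd adjust to each site's actual markup:

```python
# A rough word count for a page's main content, assuming it lives in <article> or <main>.
import requests
from bs4 import BeautifulSoup

def content_word_count(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    body = soup.find("article") or soup.find("main") or soup.body
    return len(body.get_text(" ", strip=True).split()) if body else 0

for page in ["https://www.example.com/guides/seo-basics"]:  # placeholder URL
    print(page, content_word_count(page))
```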
Quality of images/video. Also highly subjective (but again, compare your site to your competitors, and be brutally honest). Judge each site's media items based on:
- Resolution (do the images or videos appear to be high quality? Grainy?),
- Whether they are unique (do the images or videos appear to be from stock resources?),
- Whether the photos or videos are repeated on multiple sample pages.
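You can make the resolution and repetition checks a little more objective with image dimensions and content hashes. This sketch assumes Pillow is installed, and it only catches exact duplicates (resized or re-compressed copies won't match), so it supplements rather than replaces your own judgment:

```python
# Record each image's dimensions (a crude resolution proxy) and a content hash
# so repeated images can be spotted across the sampled pages.
import hashlib
from io import BytesIO
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup
from PIL import Image

def image_fingerprints(page_url):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    results = []
    for img in soup.find_all("img", src=True)[:20]:
        src = urljoin(page_url, img["src"])
        try:
            raw = requests.get(src, timeout=10).content
            width, height = Image.open(BytesIO(raw)).size
            results.append({"src": src, "size": (width, height),
                            "hash": hashlib.md5(raw).hexdigest()})
        except Exception:
            continue  # skip SVGs, broken links, and anything Pillow can't open
    return results

# Compare the "hash" values across your sampled pages to spot repeated images.
print(image_fingerprints("https://www.example.com/"))
```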
Audience (engagement and sharing of content)
Number of linking root domains. This factor is exclusively based on the total number of dofollow linking root domains (LRDs) to each domain (not total backlinks).
You can pull this number from Moz's Open Site Explorer (OSE) or from Ahrefs. Since this measurement is only the total number of LRDs to each competitor, you don't need to graph them. However, you will have an opportunity to display the sheer quantity of links by their domain authority in the next checklist item.
Quality of linking root domains. Here is where we get to the quality of each site's LRDs. Using the same LRD data you exported from either Moz's OSE or Ahrefs, you can bucket each brand's LRDs by domain authority and count the total LRDs by DA. Log these into this third sheet, and you'll have a graph that illustrates their overall LRD quality (and will help you grade each domain).
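If your LRD export is a CSV, the bucketing itself is a few lines of pandas. The file name and "Domain Authority" column are placeholders; match them to whatever your export actually contains:

```python
# Bucket linking root domains by Domain Authority and count each bucket.
import pandas as pd

lrds = pd.read_csv("competitor-a-linking-root-domains.csv")  # hypothetical export, one row per LRD
bins = [0, 20, 40, 60, 80, 101]
labels = ["DA 0-19", "DA 20-39", "DA 40-59", "DA 60-79", "DA 80-100"]
lrds["DA bucket"] = pd.cut(lrds["Domain Authority"], bins=bins, labels=labels, right=False)
print(lrds["DA bucket"].value_counts().sort_index())  # paste these counts into the sheet
```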
Other people talk about our content. I like to use BuzzSumo for this checklist item. BuzzSumo allows you to see what sites have written about a particular topic or company. You can even refine your search to include or exclude certain terms as necessary.
You'll need to set a timeframe to collect this information. Set this to the past year to account for seasonality.
Actively promoting content. Using BuzzSumo again, you can alter your search to find how many of each domain's URLs have been shared on social networks. While this isn't an explicit ranking factor, strong social media marketing is correlated with good SEO. Keep the timeframe to one year, same as above.
Creating content explicitly for organic acquisition. This line item may seem similar to Appropriate informational content, but its purpose is to examine whether the competitors create pages to target keywords users are searching for.
Plug the same URLs from your found "organic content" into SEMRush, and note whether they are ranking for non-branded keywords. You can grade the competitors on whether (and how many of) the sampled pages are ranking for any non-branded terms, and weight them based on their relative rank positions.
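Here's one way to do that filtering on an exported keyword report. The file, column names, and brand terms are placeholders you'd swap for your own, and the position weighting is just one simple option:

```python
# Filter an exported keyword report down to non-branded terms and weight by position.
import pandas as pd

BRAND_TERMS = ["acme", "acme co"]  # hypothetical brand variants to exclude

keywords = pd.read_csv("sample-page-organic-keywords.csv")  # assumes "Keyword" and "Position" columns
non_branded = keywords[~keywords["Keyword"].str.contains("|".join(BRAND_TERMS), case=False)]
# Weight by position: page-one rankings count for more than page-five rankings.
weighted = (1 / non_branded["Position"]).sum()
print(f"{len(non_branded)} non-branded keywords, weighted score {weighted:.1f}")
```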
Conversion
You should treat this section as a UX exercise. Visit each competitor's sampled URLs as though they are your landing page from search. Is it clear what the calls to action are? What is the next logical step in your user journey? Does it feel like you're getting the right information, in the right order, as you click through?
Clear CTAs on site. Of your sample pages, examine what the calls to action (CTAs) are. This is largely UX-based, so use your best judgment when evaluating whether they seem easy to understand. For inspiration, take a look at these examples of CTAs.
Conversions appropriate to several funnel steps. This checklist item asks you to determine whether the funnel steps towards conversion feel like the correct "next step" from the user's standpoint.
Even if you are not a UX specialist, you can assess each site as though you are a first-time user. Document the areas of the pages where you feel frustrated or confused (and where you don't). User behavior is a ranking signal, so while this is a qualitative measurement, it can help you understand the UX of each site.
CTAs match user intent inferred from content. Here is where you'll evaluate whether the CTAs match the user intent from the content as well as the CTA language. For instance, if a CTA prompts a user to click "for more information," and takes them to a subscription page, the visitor will most likely be confused or irritated (and, in reality, will probably leave the site).
This analysis should help you holistically identify areas of opportunity available in your search landscape, without having to guess which "best practice" you should test next. Once you've started this competitive analysis, trends among the competition will emerge, and expose niches where your site can improve and potentially outpace your competition.
Kick off your own SEO competitive analysis and comment below on how it goes! If this process is your jam, or you'd like to argue with it, come see me speak about these competitive analyses and the campaigns they've inspired at SearchLove London. Bonus? If you use that link, you'll get £50 off your tickets.